Multi-Task Feature Learning Via Efficient l2,1-Norm Minimization
Abstract
The problem of joint feature selection across a group of related tasks has applications in many areas, including biomedical informatics and computer vision. We consider the l2,1-norm regularized regression model for joint feature selection from multiple tasks, which can be derived in a probabilistic framework by assuming a suitable prior from the exponential family. One appealing property of the l2,1-norm regularization is that it encourages multiple predictors to share similar sparsity patterns. However, the resulting optimization problem is challenging to solve due to the non-smoothness of the l2,1-norm regularization. In this paper, we propose to accelerate the computation by reformulating it as two equivalent smooth convex optimization problems, which are then solved via Nesterov's method, an optimal first-order black-box method for smooth convex optimization. A key building block in solving the reformulations is the Euclidean projection. We show that the Euclidean projection for the first reformulation can be computed analytically, while the Euclidean projection for the second one can be computed in linear time. Empirical evaluations on several data sets verify the efficiency of the proposed algorithms.
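To make the formulation concrete: with k tasks and d features, the task predictors are stacked into a weight matrix W of size d-by-k, and the regularizer is ||W||_{2,1} = sum over rows of the l2 norm of each row, so an entire row (one feature across all tasks) is driven to zero together. The Python/NumPy sketch below illustrates the l2,1-norm and a Euclidean projection onto the l2,1-ball by reducing it to an l1-ball projection of the vector of row norms; it uses a simple sort-based O(d log d) l1 projection rather than the linear-time routine proposed in the paper, so it is an illustrative sketch, not the authors' algorithm.

import numpy as np

def l21_norm(W):
    # ||W||_{2,1}: sum of the l2 norms of the rows of W (d features x k tasks).
    return np.linalg.norm(W, axis=1).sum()

def project_l1_ball(c, z):
    # Euclidean projection of a nonnegative vector c onto the l1-ball of
    # radius z. Sort-based O(d log d) routine; the paper develops a
    # linear-time alternative.
    if c.sum() <= z:
        return c.copy()
    u = np.sort(c)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(u) + 1) > css - z)[0][-1]
    theta = (css[rho] - z) / (rho + 1.0)
    return np.maximum(c - theta, 0.0)

def project_l21_ball(W, z):
    # Euclidean projection of W onto {V : ||V||_{2,1} <= z}: project the row
    # norms onto the l1-ball, then rescale each row by its shrunken norm.
    norms = np.linalg.norm(W, axis=1)
    shrunk = project_l1_ball(norms, z)
    scale = np.divide(shrunk, norms, out=np.zeros_like(norms), where=norms > 0)
    return W * scale[:, None]

Projections of this kind are the key building block when Nesterov's method is applied to the constrained reformulation: each iteration takes a gradient step and projects the iterate back onto the feasible set.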
Similar References
Exclusive Sparsity Norm Minimization with Random Groups via Cone Projection
Many practical applications, such as gene expression analysis, multi-task learning, image recognition, signal processing, and medical data analysis, pursue a sparse solution for the purpose of feature selection and particularly favor nonzeros that are evenly distributed across different groups. The exclusive sparsity norm has been widely used to serve this purpose. However, it still lacks systematic stu...
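The exclusive sparsity regularizer is commonly written as the sum over groups of the squared l1 norm of each group's coefficients, which penalizes coefficients that crowd into the same group and therefore favors nonzeros spread evenly across groups. A minimal illustration under that standard definition (the group partition below is a hypothetical example, not taken from the paper):

import numpy as np

def exclusive_sparsity(w, groups):
    # Exclusive sparsity norm: sum over groups of (l1 norm of the group)^2.
    return sum(np.abs(w[g]).sum() ** 2 for g in groups)

groups = [np.array([0, 1, 2]), np.array([3, 4, 5])]
spread = np.array([1.0, 0.0, 0.0, 1.0, 0.0, 0.0])  # one nonzero per group
packed = np.array([1.0, 1.0, 0.0, 0.0, 0.0, 0.0])  # both nonzeros in one group
print(exclusive_sparsity(spread, groups))  # 2.0: evenly spread, smaller penalty
print(exclusive_sparsity(packed, groups))  # 4.0: concentrated, larger penalty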
l2,1 Regularized Correntropy for Robust Feature Selection
In this paper, we study the problem of robust feature extraction based on l2,1-regularized correntropy from both theoretical and algorithmic perspectives. On the theoretical side, we point out that l2,1-norm minimization can be justified from the viewpoint of half-quadratic (HQ) optimization, which facilitates convergence analysis and algorithmic development. In particular, a general formulation is accordi...
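Correntropy-induced losses are typically of the Welsch form sum_i (1 - exp(-r_i^2 / (2 sigma^2))), and the half-quadratic view turns their minimization into iteratively reweighted least squares, where the auxiliary weights exp(-r_i^2 / (2 sigma^2)) automatically down-weight outlying residuals. A minimal sketch under these standard definitions (not the paper's exact algorithm):

import numpy as np

def correntropy_irls(X, y, sigma=1.0, iters=20):
    # Robust regression with a Welsch (correntropy-induced) loss solved by
    # half-quadratic optimization, i.e., iteratively reweighted least squares.
    w = np.linalg.lstsq(X, y, rcond=None)[0]      # ordinary least-squares start
    for _ in range(iters):
        r = y - X @ w                             # current residuals
        p = np.exp(-r ** 2 / (2 * sigma ** 2))    # HQ auxiliary weights in (0, 1]
        Xp = X * p[:, None]
        w = np.linalg.solve(X.T @ Xp, Xp.T @ y)   # weighted least-squares update
    return w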
Joint adaptive loss and l2/l0-norm minimization for unsupervised feature selection
Unsupervised feature selection is a useful tool for reducing the complexity and improving the generalization performance of data mining tasks. In this paper, we propose an Adaptive Unsupervised Feature Selection (AUFS) algorithm with explicit l2/l0-norm minimization. We use a joint adaptive loss for data fitting and an l2/l0-norm penalty for feature selection. We solve the optimization problem w...
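Here the l2/l0 term counts the number of nonzero rows of the weight matrix, so constraining it to at most k rows amounts to hard selection of k features. A small sketch under that standard reading (variable names are illustrative, not the paper's notation):

import numpy as np

def l20_norm(W, tol=1e-12):
    # ||W||_{2,0}: number of rows of W whose l2 norm is (numerically) nonzero.
    return int((np.linalg.norm(W, axis=1) > tol).sum())

def keep_top_k_rows(W, k):
    # Euclidean projection onto {V : ||V||_{2,0} <= k}: keep the k rows of
    # largest l2 norm and zero out the rest (hard feature selection).
    norms = np.linalg.norm(W, axis=1)
    keep = np.argsort(norms)[-k:]
    V = np.zeros_like(W)
    V[keep] = W[keep]
    return V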
Inexact Accelerated Proximal Gradient Algorithms for the Matrix l2,1-Norm Minimization Problem in Multi-Task Feature Learning
In this paper, we extend the implementable APG (accelerated proximal gradient) method to solve the matrix l2,1-norm minimization problem arising in multi-task feature learning. We show that the resulting inner subproblem has a closed-form solution which can be easily obtained by exploiting the problem's favorable structure. Under suitable conditions, we establish a comprehensive convergence result for the proposed met...
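The favorable structure referred to is that the proximal subproblem min_W (1/2)||W - V||_F^2 + lam * ||W||_{2,1} decouples across the rows of W, and each row has a well-known closed-form shrinkage solution. A minimal sketch of that closed form (a standard result, not specific to this paper's inexact APG scheme):

import numpy as np

def prox_l21(V, lam):
    # Proximal operator of lam * ||.||_{2,1}: each row v of V is shrunk to
    # max(0, 1 - lam / ||v||_2) * v, zeroing out rows with small norm.
    norms = np.linalg.norm(V, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - lam / np.maximum(norms, 1e-12))
    return V * scale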
Publication date: 2009